Tesla involved in fatal Washington crash was using self-driving mode

Engadget

A deadly accident in Washington that took the life of a motorcyclist earlier this year involved a Tesla operating in "Full Self-Driving" mode. The Associated Press reported that investigators from the Washington State Patrol confirmed, from the car's event-data recorder, that a 2022 Tesla Model S involved in the fatal April crash was in self-driving mode. The accident occurred on April 19 on the eastbound side of State Route 522, approximately 15 miles northeast of Seattle. The unidentified driver told police he had his Tesla's self-driving mode on and was looking at his phone at the time of the crash. The vehicle crashed into the back of the motorcycle, pinning Jeffrey Nissen, 28, underneath.


Tesla driver using self-driving mode slammed into police cruiser in Orange County

Los Angeles Times

A Tesla driver using the vehicle's self-driving mode crashed into a police car Thursday morning in Fullerton, almost hitting an officer who was investigating another crash, according to authorities. A Fullerton Police Department officer was investigating a fatal crash around 12:04 a.m. The officer was managing traffic at the time and emergency flares had been placed on the road. The officer was standing outside his patrol vehicle, with its emergency lights on, and managed to jump out of the way before the driver of a blue Tesla crashed into his car, authorities said. A police dispatcher, who was riding in the patrol vehicle, also moved out of the way of the crash.


UK unveils £40m innovation fund for self-driving buses and vans

#artificialintelligence

You could soon see self-driving buses and delivery vans on UK roads as the government launches a £40m ($50m) competition to bring this technology to the market. The funding to kick-start commercial self-driving services, such as delivery vehicles and passenger shuttles, will help bring together companies and investors so that sustainable business models can be rolled out nationally and exported globally. The Commercialising Connected and Automated Mobility competition will provide grants to help roll out commercial self-driving vehicles across the UK from 2025. Types of self-driving vehicles that could be deployed include delivery vans, passenger buses, shuttles and pods, as well as vehicles that move people and luggage at airports and containers at shipping ports. The competition aims to unlock a new industry that could be worth £42bn to the UK economy by 2035, potentially creating 38,000 new skilled jobs.


A self-driving revolution? We're barely out of second gear | John Naughton

The Guardian

"Britain moves closer to a self-driving revolution," said a perky message from the Department for Transport that popped into my inbox on Wednesday morning. The purpose of the message was to let us know that the government is changing the Highway Code to "ensure the first self-driving vehicles are introduced safely on UK roads" and to "clarify drivers' responsibilities in self-driving vehicles, including when a driver must be ready to take back control". The changes will specify that while travelling in self-driving mode, motorists must be ready to resume control in a timely way if they are prompted to, such as when they approach motorway exits. They also signal a puzzling change to current regulations, allowing drivers "to view content that is not related to driving on built-in display screens while the self-driving vehicle is in control". So you could watch Gardeners' World on iPlayer, but not YouTube videos of F1 races? Reassuringly, though, it will still be illegal to use mobile phones in self-driving mode, "given the greater risk they pose in distracting drivers as shown in research".


Tesla recalls 50,000 cars that disobey stop signs in self-driving mode

New Scientist

Tesla is recalling more than 50,000 cars in the US because the AI behind its self-driving feature acted too aggressively, rolling past stop signs rather than coming fully to a halt as required by law in many states. The company's Full Self-Driving code had been continuing through stop signs at up to 5.6 miles per hour. The US National Highway Traffic Safety Administration (NHTSA) issued a safety recall for 53,822 cars that are currently running firmware version 2020.40.4.10, which contains the "rolling stop" feature. Tesla has agreed to disable it remotely on all affected cars, including its Model 3, Model S, Model X and Model Y. The rolling stop feature was introduced by Tesla in October, but the NHTSA says that failing to come to a complete stop "may increase the risk of collision".


Tesla will hike prices on self-driving mode, again

Engadget

Tesla's "full self-driving" (FSD) feature has had something of a rocky history, to put things generously. It has been implicated in multiple crashes, dogged by seemingly persistent software bugs and subjected to a cavalcade of scrutiny from a panoply of regulatory bodies. Also, it's about to cost more money. The bump represents an additional $2,000 being tacked onto the not insubstantial price tag: a new grand total of $12,000, or most of the way to a Honda Civic. Nor is it the first time FSD has gotten more expensive.


What about regulation for self-driving vehicles? - Marketplace

#artificialintelligence

On Wednesday's show we talked about Tesla's full self-driving mode, which it is about to make available to more drivers. And while the name implies that the cars will drive themselves, a human will still have to be in control. That's where we are right now with self-driving cars: they might help you drive, but they might also make a mistake that causes an accident if you're not paying attention.


Artificial Intelligence Is Not Enough to Create Fully Autonomous Cars For Now -- Here's Why | Analytics Insight

#artificialintelligence

Artificial intelligence is arguably the most disruptive piece of technology that we're currently developing. There are many aspects of life that could change drastically once it becomes a mainstream product. In a lot of these aspects, artificial intelligence, when coupled with robotics, may even completely replace their human counterparts in certain industries. One aspect that comes to mind is that of driving. And with the way that so many tech companies seem to be putting a significant amount of time and research into developing a fully autonomous car, it would seem that they are completely adamant about rolling out such a product.


Waymo Collision Illustrates Why Society Might Eventually Ban Human Driving

#artificialintelligence

On October 19, a Waymo Pacifica struck and injured a motorcyclist in California. As is often the case, the collision was caused by a human - in this instance, the safety driver in the Waymo vehicle. In an unusual twist, however, Waymo CEO John Krafcik revealed that if the safety operator had not taken control of the autonomous minivan, the self-driving software would have avoided the collision: "Our simulation shows the self-driving system would have responded to the passenger car by reducing our vehicle's speed, and nudging slightly in our own lane, avoiding a collision." According to Waymo's account, the Waymo autonomous vehicle ("Waymo AV") was traveling at approximately 21 MPH westbound in Lane 2 of El Camino Real in Mountain View in self-driving mode. When a passenger vehicle in Lane 1, to the left of the Waymo AV, began to change lanes into Lane 2 to avoid a box truck blocking two lanes of traffic, Waymo's test driver took manual control of the AV out of an abundance of caution, disengaged from self-driving mode, and began changing lanes into Lane 3. A motorcycle, traveling at approximately 28 MPH behind the Waymo AV, had just entered Lane 3 to overtake the Waymo AV on its right. The motorcyclist reported injuries and was transported to the hospital for treatment; the Waymo AV sustained minor damage to the rear bumper.


Self-driving Uber likely killed woman because it ignored her

Daily Mail - Science & tech

Uber's self-driving technology software detected a woman as she was crossing the street with her bicycle in Arizona in March but failed to react before she was fatally hit by an autonomous vehicle, according to the results of an internal investigation. The cameras, lidar, and radar were all working properly on the semi-autonomous Volvo SUV as it was driving at normal speed on a highway in Tempe on the night of March 18. But the system did not react when it detected a woman walking across the highway, since it was programmed to treat passing objects on the road, such as plastic bags, as 'false positives' that ought to be ignored, according to the results of Uber's preliminary probe. The Volvo SUV was in self-driving mode with a human back-up operator behind the wheel when the woman, who was walking a bicycle, was hit. Elaine Herzberg, 49, died in hospital.